Multi-target Knowledge Distillation via Student Self-reflection

Authors

Abstract

Knowledge distillation is a simple yet effective technique for deep model compression, which aims to transfer the knowledge learned by a large teacher model to a small student model. To mimic how a teacher teaches a student, existing methods mainly adopt a unidirectional knowledge transfer, where knowledge extracted from different intermediate layers of the teacher is used to guide the student. However, it turns out that in real-world education scenarios students learn more effectively through multi-stage learning with self-reflection, which is nevertheless ignored by current distillation methods. Inspired by this, we devise a new framework entitled multi-target knowledge distillation via student self-reflection, or MTKD-SSR, which can not only enhance the teacher's ability to unfold the knowledge to be distilled, but also improve the student's capacity to digest that knowledge. Specifically, the proposed framework consists of three target distillation mechanisms: stage-wise channel distillation (SCD), stage-wise response distillation (SRD), and cross-stage review distillation (CRD). SCD and SRD transfer feature-based knowledge (i.e., channel features) and response-based knowledge (i.e., logits) at different stages, respectively, while CRD encourages the student to conduct self-reflective learning after each stage via self-distillation. Experimental results on five popular visual recognition datasets, CIFAR-100, Market-1501, CUB200-2011, ImageNet, and Pascal VOC, demonstrate that the proposed framework significantly outperforms recent state-of-the-art methods.
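To make the three mechanisms concrete, the sketch below shows one plausible form of the combined training objective in PyTorch. The per-stage feature and logit lists, the channel-attention recipe used for SCD, and the temperature and loss weights are illustrative assumptions, not the authors' released implementation; the abstract implies per-stage outputs, which in practice would come from auxiliary classifier heads attached to each backbone stage.

```python
import torch
import torch.nn.functional as F

# Minimal sketch of the three MTKD-SSR loss terms. Tensor layouts, the
# channel-attention recipe, temperature, and loss weights are assumptions
# for illustration, not the paper's exact formulation.

def scd_loss(f_s, f_t):
    """Stage-wise channel distillation (SCD): align channel-wise activation
    distributions of student and teacher feature maps.
    f_s, f_t: (batch, channels, H, W), assumed to have matching channels."""
    # Pool spatial dims, then treat per-channel activation mass as a
    # distribution over channels (one common channel-distillation recipe).
    log_s = F.log_softmax(f_s.flatten(2).mean(-1), dim=1)  # (B, C)
    t = F.softmax(f_t.flatten(2).mean(-1), dim=1)
    return F.kl_div(log_s, t, reduction="batchmean")

def srd_loss(z_s, z_t, T=4.0):
    """Stage-wise response distillation (SRD): softened-logit KL divergence
    (Hinton et al., 2015) applied to each stage's classifier output."""
    return T * T * F.kl_div(
        F.log_softmax(z_s / T, dim=1),
        F.softmax(z_t / T, dim=1),
        reduction="batchmean",
    )

def crd_loss(z_stage, z_final, T=4.0):
    """Cross-stage review distillation (CRD): self-distillation in which the
    student's final output supervises its own earlier-stage outputs."""
    return T * T * F.kl_div(
        F.log_softmax(z_stage / T, dim=1),
        F.softmax(z_final.detach() / T, dim=1),  # final head acts as "teacher"
        reduction="batchmean",
    )

def mtkd_ssr_loss(stu_feats, stu_logits, tea_feats, tea_logits, labels,
                  alpha=1.0, beta=1.0, gamma=1.0):
    """Combine cross-entropy with SCD, SRD, and CRD across all stages.
    stu_feats / tea_feats: lists of per-stage feature maps.
    stu_logits / tea_logits: lists of per-stage logits (last = final head)."""
    loss = F.cross_entropy(stu_logits[-1], labels)
    for f_s, f_t in zip(stu_feats, tea_feats):
        loss = loss + alpha * scd_loss(f_s, f_t)
    for z_s, z_t in zip(stu_logits, tea_logits):
        loss = loss + beta * srd_loss(z_s, z_t)
    for z_s in stu_logits[:-1]:  # "review": final head teaches earlier stages
        loss = loss + gamma * crd_loss(z_s, stu_logits[-1])
    return loss
```

Detaching the final logits in crd_loss keeps the self-reflection signal unidirectional, so the review term refines the earlier stages without pulling the final head toward its own less mature intermediate predictions.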


Similar Articles

Learner Reflection in Student Self-Assessment

Learner reflection has been found important and valuable, especially in cognitively intensive tasks such as learning programming. This paper presents K-Assess, a programming education system that facilitates student self-evaluation and promotes learner reflection. It is built on top of an existing conventional programming teaching system by adding a pragmatic knowledge layer. Our evaluations ...


Reflection perspectives of Tabriz Nursing Student

Introduction: The phenomenon of knowledge explosion has led teachers to feel the necessity of training students so that they become reflective thinkers. This issue is even more important for nursing students, who are responsible for providing care for patients. This study is part of a larger study aiming at the exploration of nursing students' views on reflection on practice. Methods: 20 senior nursing...


Sequence-Level Knowledge Distillation

Neural machine translation (NMT) offers a novel alternative formulation of translation that is potentially simpler than statistical approaches. However, to reach competitive performance, NMT models need to be exceedingly large. In this paper we consider applying knowledge distillation approaches (Bucila et al., 2006; Hinton et al., 2015) that have proven successful for reducing the size of neura...


“You Have to Absorb Yourself in It”: Using Inquiry and Reflection to Promote Student Learning and Self-knowledge

Inspired by inquiry-guided learning and critical self-reflection as pedagogical approaches, we describe exercises that encourage students to develop critical thinking skills through inquiry and reflective writing. Students compile questions and reflections throughout the course and, at the end of the term, use their writings for a comprehensive analytic self-reflection that examines their intel...


Topic Distillation with Knowledge Agents

This is the second year that our group has participated in TREC's Web track. Our experiments focused on the Topic distillation task. Our main goal was to experiment with the Knowledge Agent (KA) technology [1], previously developed at our lab, for this particular task. The knowledge agent approach was designed to enhance Web search results by utilizing domain knowledge. We first describe the generi...



Journal

Journal title: International Journal of Computer Vision

Year: 2023

ISSN: 0920-5691 (print), 1573-1405 (electronic)

DOI: https://doi.org/10.1007/s11263-023-01792-z